Relaxed Analysis Operator Learning

Authors

  • Mehrdad Yaghoobi
  • Mike E. Davies
Abstract

The problem of analysis operator learning can be formulated as a constrained optimisation problem. This problem has been approximately solved using projected gradient or geometric gradient descent methods. In this paper, we propose a relaxation of constrained analysis operator learning. The relaxation is suggested here to (a) reduce the computational complexity of the optimisation and (b) include a larger set of admissible operators. We show that an appropriate relaxation can be useful in formulating a projection-free optimisation algorithm, while preventing the problem from becoming ill-posed. The relaxed optimisation objective is not convex, and it is thus not always possible to find the global optimum. However, when a rich set of training samples is given, we empirically show that the desired synthetic analysis operator is recoverable using the introduced sub-gradient descent algorithm.
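The projection-free scheme described above can be illustrated with a minimal sub-gradient descent sketch. This is a hypothetical reconstruction, not the paper's algorithm: the sparsity term is the usual l1 penalty on the analysis coefficients, while the relaxation penalty used here (a Frobenius-norm term plus a log-det barrier on Omega^T Omega, which rules out the trivial and rank-deficient operators) is only one plausible choice; the paper's exact relaxed constraint may differ. All names and parameter values are illustrative.

```python
import numpy as np

def learn_analysis_operator(Y, n_rows, n_iter=200, step=0.02, lam=0.1, seed=0):
    """Sub-gradient descent sketch for relaxed analysis operator learning.

    Minimises (1/N) * ||Omega @ Y||_1 + lam * penalty(Omega), where
    penalty(Omega) = ||Omega||_F^2 - log det(Omega^T Omega) is a
    hypothetical relaxation keeping Omega away from the ill-posed
    (zero or rank-deficient) solutions.  Requires n_rows >= dim so
    that Omega^T Omega is invertible.
    """
    rng = np.random.default_rng(seed)
    d, N = Y.shape
    Omega = rng.standard_normal((n_rows, d))
    Omega /= np.linalg.norm(Omega, axis=1, keepdims=True)  # unit-norm rows at init
    for _ in range(n_iter):
        # Sub-gradient of the l1 sparsity term: sign(Omega Y) Y^T, averaged over samples.
        g_sparse = np.sign(Omega @ Y) @ Y.T / N
        # Gradient of the relaxed penalty ||Omega||_F^2 - log det(Omega^T Omega).
        G = Omega.T @ Omega
        g_pen = 2.0 * Omega - 2.0 * Omega @ np.linalg.inv(G)
        # Projection-free update: plain sub-gradient step, no manifold projection.
        Omega -= step * (g_sparse + lam * g_pen)
    return Omega
```

In the constrained formulations referenced in the abstract, each iteration would instead project Omega back onto an admissible set (for example, a uniform normalised tight frame manifold); the point of the relaxation is that the barrier term replaces that projection.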


Similar resources

Generalized (A, η)-Resolvent Operator Technique and Sensitivity Analysis for Relaxed Cocoercive Variational Inclusions

Sensitivity analysis for relaxed cocoercive variational inclusions based on the generalized resolvent operator technique is discussed. The obtained results are general in nature.


Relaxed resolvent operator for solving a variational inclusion problem

In this paper, we introduce a new resolvent operator, which we call the relaxed resolvent operator. We prove that the relaxed resolvent operator is single-valued and Lipschitz continuous, and finally we approximate the solution of a variational inclusion problem in Hilbert spaces by defining an iterative algorithm based on the relaxed resolvent operator. A few concepts like Lipschitz continuity and strong m...


On Over-Relaxed Proximal Point Algorithms for Generalized Nonlinear Operator Equation with (A,η,m)-Monotonicity Framework

In this paper, a new class of over-relaxed proximal point algorithms for solving nonlinear operator equations within the (A,η,m)-monotonicity framework in Hilbert spaces is introduced and studied. Further, by using the generalized resolvent operator technique associated with (A,η,m)-monotone operators, the approximation solvability of the operator equation problems and the convergence of iterativ...
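The over-relaxed proximal point iteration mentioned above has a simple generic form, x_{k+1} = (1 - rho) x_k + rho * J_c(x_k), where J_c is the resolvent of the monotone operator and rho is the over-relaxation parameter. The scalar sketch below is purely illustrative and much simpler than the (A,η,m)-monotone setting of the paper: it assumes the operator is the subdifferential of c*|x|, whose resolvent is the soft-thresholding map.

```python
import numpy as np

def soft_threshold(x, t):
    # Resolvent (proximal map) of the subdifferential of t * |x|.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def over_relaxed_ppa(x0, c=0.5, rho=1.5, n_iter=100):
    """Over-relaxed proximal point iteration:
    x_{k+1} = (1 - rho) x_k + rho * J_c(x_k), with rho in (0, 2)."""
    x = x0
    for _ in range(n_iter):
        x = (1 - rho) * x + rho * soft_threshold(x, c)
    return x
```

For rho = 1 this reduces to the classical proximal point algorithm; values of rho between 1 and 2 over-relax each step, which can speed up convergence to a zero of the operator (here, x = 0).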


The Introduction of a Heuristic Mutation Operator to Strengthen the Discovery Component of XCS

The extended classifier system (XCS) tries to solve learning problems online by producing a set of rules (classifiers). XCS is a rather complex combination of a genetic algorithm and reinforcement learning: the genetic algorithm tries to discover promising rules, and reinforcement learning values them. Among the important factors in the performance of XCS is the possibility to...


On over-relaxed (A,η,m)-proximal point algorithm frameworks with errors and applications to general variational inclusion problems

The purpose of this paper is to provide some remarks on the main results of the paper Verma (Appl. Math. Lett. 21:142-147, 2008). Further, by using the generalized proximal operator technique associated with (A,η,m)-monotone operators, we discuss the approximation solvability of forms of general variational inclusion problems in Hilbert spaces and the convergence analysis of iterative sequence...



Journal title:

Volume   Issue

Pages  -

Publication date: 2012